on. Therefore, some more advanced sequence comparison algorithms were developed more than a decade ago, i.e., algorithms that compare sequences based on sequence statistics such as the word library or a related sequence-statistic library [Gao and Qi, 2007; Sims et al., 2009; Wang et al., 2009; … and Kim, 2011; Kolekar et al., 2012; Hatje and Kollmar, 2012; … et al., 2014]. This type of algorithm is known as the alignment-free sequence comparison approach and has played an important role in many modern whole-genome pattern discovery projects. In addition, quantum computing techniques have also been applied to sequence homology alignment [Huo et al., 2008; Prousalis and Konofaos, 2019] and whole-genome de novo sequence assembly [… and Nam, 2018].
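As a minimal sketch of the word-statistics idea behind alignment-free comparison (not the method of any cited work; the function names and the toy sequences are illustrative only), two sequences can each be summarised as a library of overlapping k-mer counts and then compared by a vector distance, with no alignment step at all:

```python
from collections import Counter
from math import sqrt

def kmer_profile(seq, k=3):
    """Build the word library: counts of all overlapping k-mers in seq."""
    return Counter(seq[i:i + k] for i in range(len(seq) - k + 1))

def cosine_distance(p, q):
    """Alignment-free distance between two k-mer count profiles (0 = identical)."""
    keys = set(p) | set(q)
    dot = sum(p[w] * q[w] for w in keys)
    norm = sqrt(sum(v * v for v in p.values())) * sqrt(sum(v * v for v in q.values()))
    return 1.0 - dot / norm

# Toy example: two short DNA fragments differing by a small substitution.
a = kmer_profile("ACGTACGTACGT")
b = kmer_profile("ACGTACGAACGT")
d = cosine_distance(a, b)
```

Because only word counts are compared, the cost is linear in sequence length, which is what makes this family of methods attractive for whole-genome studies.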
The global optimisation pattern discovery problem
Machine learning algorithms are originally based on Newton's method, which is also called the steepest descent method or the greedy search method. To estimate the parameters of a machine learning model, a numerical objective function, which must be differentiable, needs to be defined first. The basic principle is to find a stationary point on the objective function curve; at such a point, the objective function is minimised. The most common way of detecting a stationary point within an objective function curve is based on the derivative function of the objective function, which is why the objective function must be differentiable. A stationary point resides where the first derivative function is zero and the objective function value at that point is locally minimised.
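The principle above can be sketched in a few lines of code (a toy illustration, not the implementation used in any of the cited works; the function names and the example objective are hypothetical): Newton's method searches for the point where the first derivative is zero by repeatedly dividing the derivative by the second derivative and stepping accordingly.

```python
def newton_minimise(f_prime, f_double_prime, x0, tol=1e-10, max_iter=100):
    """Newton's method: iterate toward a stationary point where f'(x) = 0."""
    x = x0
    for _ in range(max_iter):
        step = f_prime(x) / f_double_prime(x)  # Newton step: f'(x) / f''(x)
        x -= step
        if abs(step) < tol:  # derivative is (numerically) zeroed
            break
    return x

# Example objective: f(x) = (x - 3)^2 + 1, whose minimum is at x = 3.
fp = lambda x: 2 * (x - 3)   # first derivative
fpp = lambda x: 2.0          # second derivative
x_star = newton_minimise(fp, fpp, x0=10.0)
```

For this quadratic objective a single Newton step lands exactly on the minimum; for general objectives the iteration converges rapidly only near a stationary point, which is why the choice of initial point matters in practice.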
The search for a stationary point within an objective function curve to estimate the parameters of a complicated model is called Newton's method [Fletcher, 1987; Nocedal and Wright, 1999]. With this method, given a suitable initial point, the search moves to a stationary point of the objective function curve very quickly and efficiently. The parameters of a model which have been mapped to a stationary point are called